AIbase

# 7 billion parameters

## LLäMmlein 7B
License: Other
LLäMmlein 7B is a German LLaMA language model with 7 billion parameters, trained from scratch on the German portion of the RedPajama V2 dataset using an adapted TinyLlama codebase.
Tags: Large Language Model, Transformers, German
Author: LSX-UniWue · 251 · 2
## Mistral 7B
License: Other
Mistral 7B is a large language model from Mistral AI with 7 billion parameters, designed for high efficiency and performance and suited to real-time applications that require fast responses.
Tags: Large Language Model
Author: cortexso · 1,768 · 1
## OpenHathi 7B Hi v0.1 Base
The first model in the OpenHathi series, built on the Llama 2 architecture with 7 billion parameters; it supports Hindi, English, and mixed Hindi-English text.
Tags: Large Language Model, Other
Author: sarvamai · 655 · 112
## LINCE-ZERO
License: Apache-2.0
LINCE-ZERO is a 7-billion-parameter Spanish instruction-tuned large language model built on the Falcon-7B architecture and fine-tuned on 80,000 proprietary instruction examples.
Tags: Large Language Model, Transformers, Spanish
Author: clibrain · 93 · 48
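The four entries above can be treated as structured records, which makes the catalog easy to filter or sort programmatically. The sketch below is illustrative only: the field names are assumptions, and the two unlabeled counters shown on each entry are omitted because the page does not say what they measure.

```python
# The four model entries listed above, as structured records.
# Field names are assumptions chosen for this sketch; the source
# page presents the same data as an unlabeled listing.
models = [
    {"name": "LLäMmlein 7B", "author": "LSX-UniWue",
     "license": "Other", "languages": ["German"], "params_b": 7},
    {"name": "Mistral 7B", "author": "cortexso",
     "license": "Other", "languages": [], "params_b": 7},
    {"name": "OpenHathi 7B Hi v0.1 Base", "author": "sarvamai",
     "license": "Other", "languages": ["Hindi", "English"], "params_b": 7},
    {"name": "LINCE-ZERO", "author": "clibrain",
     "license": "Apache-2.0", "languages": ["Spanish"], "params_b": 7},
]

# Example query: models under a permissive license.
permissive = [m["name"] for m in models if m["license"] == "Apache-2.0"]
print(permissive)  # ['LINCE-ZERO']
```

The same record shape extends naturally if later pages of the tag add more entries or expose what the per-model counters mean.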
© 2025 AIbase